Second-Order Kernel Online Convex Optimization with Adaptive Sketching

Authors

  • Daniele Calandriello
  • Alessandro Lazaric
  • Michal Valko
Abstract

Kernel online convex optimization (KOCO) is a framework combining the expressiveness of nonparametric kernel models with the regret guarantees of online learning. First-order KOCO methods such as functional gradient descent require only O(t) time and space per iteration and, when the only information on the losses is their convexity, achieve a minimax-optimal O(√T) regret. Nonetheless, many common losses in kernel problems, such as the squared loss, logistic loss, and squared hinge loss, possess stronger curvature that can be exploited. In this case, second-order KOCO methods achieve O(log(Det(K))) regret, which we show scales as O(d_eff log T), where d_eff is the effective dimension of the problem and is usually much smaller than O(√T). The main drawback of second-order methods is their much higher O(t²) space and time complexity. In this paper, we introduce kernel online Newton step (KONS), a new second-order KOCO method that also achieves O(d_eff log T) regret. To address the computational complexity of second-order methods, we introduce a new matrix sketching algorithm for the kernel matrix K_t, and show that, for a chosen parameter γ ≤ 1, our Sketched-KONS reduces the space and time complexity by a factor of γ² to O(t²γ²) space and time per iteration, while incurring only 1/γ times more regret.
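As a concrete illustration of the first-order baseline contrasted above, here is a minimal Python sketch of functional gradient descent for KOCO. The Gaussian kernel, the squared loss, and the η/√t step size are illustrative assumptions not fixed by the abstract; the point is that storing the predictor as a growing kernel expansion is what yields the O(t) per-iteration cost.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel; an illustrative choice, not fixed by the paper."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma**2))

def kernel_ogd(stream, eta=1.0, sigma=1.0):
    """First-order KOCO: functional gradient descent on the squared loss.

    The predictor is kept as a kernel expansion f_t(x) = sum_i a_i k(x_i, x),
    so each round costs O(t) time and space, matching the rate quoted above.
    `stream` yields (x_t, y_t) pairs; `eta` is a hypothetical base step size.
    """
    centers, coeffs = [], []
    for t, (x, y) in enumerate(stream, start=1):
        # Predict with the current expansion: O(t) kernel evaluations.
        f_x = sum(a * gaussian_kernel(c, x, sigma) for c, a in zip(centers, coeffs))
        # For the squared loss 0.5 * (f(x) - y)^2 the functional gradient is
        # (f(x) - y) * k(x, .), so each update appends one expansion term.
        centers.append(x)
        coeffs.append(-(eta / np.sqrt(t)) * (f_x - y))  # 1/sqrt(t) decay
    return centers, coeffs
```

With only convexity available, the decaying step size gives the minimax O(√T) regret quoted above; a second-order method such as KONS instead reweights updates using (a sketch of) the kernel matrix K_t, which is where the O(t²), and after sketching O(t²γ²), per-iteration cost arises.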

Related articles

An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function

In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual interior-point method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed under some mild, easy-to-check conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...

Online Learning Via Regularized Frequent Directions

Online Newton step algorithms usually achieve good performance with fewer training samples than first-order methods, but require higher space and time complexity in each iteration. In this paper, we develop a new sketching strategy called regularized frequent directions (RFD) to improve the performance of online Newton algorithms. Unlike standard frequent directions (FD), which only maintains a...
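For context, below is a minimal Python sketch of the standard FD update that RFD builds on. The per-row SVD (practical implementations batch the shrinking step), the sketch size `ell`, and the assumption ell ≤ d are simplifications for illustration; RFD's regularization, as its name suggests, augments this update and is only noted in a comment.

```python
import numpy as np

def frequent_directions(rows, ell):
    """Standard frequent directions (FD): maintain an ell x d sketch B whose
    Gram matrix B^T B approximates A^T A from below (A^T A - B^T B is PSD).

    RFD augments this update with a regularization term, as its name
    suggests; plain FD is shown here for brevity. Assumes ell <= d.
    """
    rows = [np.asarray(r, dtype=float) for r in rows]
    B = np.zeros((ell, rows[0].size))
    for row in rows:
        B[-1] = row  # the last row of B is zero after every shrink step
        # Shrink: subtract the smallest squared singular value from all of
        # them, zeroing out at least one direction and freeing a row.
        _, s, Vt = np.linalg.svd(B, full_matrices=False)
        s = np.sqrt(np.maximum(s**2 - s[-1] ** 2, 0.0))
        B = s[:, None] * Vt
    return B
```

In the online-Newton setting, the small sketch B then stands in for the full second-order matrix, reducing the per-iteration cost from quadratic in the ambient dimension to linear in the sketch size.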

Distributed online second-order dynamics for convex optimization over switching connected graphs

This paper studies the regret of a family of distributed algorithms for unconstrained online convex optimization. A team of agents cooperates in a decision-making process enabled by local interactions and by each agent's knowledge of the local cost functions associated with its decisions in previous rounds. We propose a class of online, second-order distributed coordination algorithms that ...

Fast Kernel Learning using Sequential Minimal Optimization

While classical kernel-based classifiers are based on a single kernel, in practice it is often desirable to base classifiers on combinations of multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for the support vector machine (SVM), and showed that the optimization of the coefficients of such a combination reduces to a convex optimization problem known as...

CVaR Reduced Fuzzy Variables and Their Second Order Moments

Based on the credibilistic value-at-risk (CVaR) of a regular fuzzy variable, we introduce a new CVaR reduction method for type-2 fuzzy variables. The reduced fuzzy variables are characterized by parametric possibility distributions. We establish some useful analytical expressions for the mean values and second-order moments of common reduced fuzzy variables. The convex properties of second-order moments with ...

Publication date: 2017